Network of interacting neurons with random synaptic weights



Similar resources

Fast decorrelated neural network ensembles with random weights

Negative correlation learning (NCL) aims to produce ensembles with sound generalization capability by controlling the disagreement among the base learners' outputs. Such a learning scheme is usually implemented with feed-forward neural networks trained by error back-propagation (BPNNs). However, it suffers from slow convergence, the local-minima problem, and model uncertainties caused by t...
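As a rough illustration of the correlation penalty NCL adds to each base learner's error, here is a minimal NumPy sketch; the regression setting and the lambda value are assumptions for illustration, not details from the paper:

```python
import numpy as np

def ncl_loss(outputs, target, lam=0.5):
    """Negative correlation learning loss for one sample.

    outputs : array of shape (M,), predictions of the M base learners
    target  : scalar ground truth
    lam     : penalty strength (an illustrative value)
    """
    f_bar = outputs.mean()
    # Correlation penalty p_i = (f_i - f_bar) * sum_{j != i} (f_j - f_bar),
    # which simplifies to -(f_i - f_bar)^2 because the deviations sum to zero.
    penalty = -(outputs - f_bar) ** 2
    per_learner = 0.5 * (outputs - target) ** 2 + lam * penalty
    return per_learner.sum()

print(ncl_loss(np.array([0.9, 1.1, 1.4]), target=1.0))
```

With lam = 0, each learner is trained independently on its own squared error; increasing lam rewards outputs that deviate from the ensemble mean, which is how NCL controls the disagreement among members.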


Random synaptic feedback weights support error backpropagation for deep learning

The brain processes information through multiple layers of neurons. This deep architecture is representationally powerful, but complicates learning because it is difficult to identify the responsible neurons when a mistake is made. In machine learning, the backpropagation algorithm assigns blame by multiplying error signals with all the synaptic weights on each neuron's axon and further downstr...
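The core idea of the paper, replacing the transposed forward weights in the backward pass with a fixed random feedback matrix (feedback alignment), can be sketched as follows; the network size, ReLU activation, and learning rate are illustrative assumptions, not details from the abstract:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy dimensions; all sizes are illustrative.
n_in, n_hid, n_out = 4, 8, 2
W1 = rng.normal(scale=0.1, size=(n_hid, n_in))   # learned forward weights
W2 = rng.normal(scale=0.1, size=(n_out, n_hid))  # learned forward weights
B  = rng.normal(scale=0.1, size=(n_hid, n_out))  # fixed random feedback weights

def step(x, t, lr=0.01):
    # Forward pass through one hidden layer.
    a = W1 @ x
    h = np.maximum(a, 0.0)          # ReLU hidden activity
    y = W2 @ h
    e = y - t                       # output error
    # Backward pass: feedback alignment propagates the error through the
    # fixed random matrix B where exact backpropagation would use W2.T.
    dh = (B @ e) * (a > 0)
    W2 -= lr * np.outer(e, h)
    W1 -= lr * np.outer(dh, x)
    return 0.5 * float(e @ e)

x, t = rng.normal(size=n_in), rng.normal(size=n_out)
print("loss after one update:", step(x, t))
```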


Dynamics of a random neural network with synaptic depression

We consider a randomly connected neural network with linear threshold elements which update in discrete time steps. The two main features of the network are: (1) equally distributed and purely excitatory connections and (2) synaptic depression after repetitive firing. We focus on the time evolution of the expected network activity. The four types of qualitative behavior are investigat...
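A minimal simulation sketch of such a network, with purely excitatory random connections, discrete-time threshold updates, and a per-neuron depression variable, might look like this; all parameter values are assumptions chosen for illustration, not taken from the paper:

```python
import numpy as np

rng = np.random.default_rng(1)

# Illustrative parameters.
N, T = 200, 100            # neurons, time steps
p_conn = 0.1               # probability of a random excitatory connection
w = 1.0                    # common synaptic weight
theta = 3.0                # firing threshold of the threshold elements
u_rec, u_dep = 0.05, 0.4   # recovery and depression rates of synaptic efficacy

C = (rng.random((N, N)) < p_conn).astype(float)  # random excitatory connectivity
d = np.ones(N)                                   # per-neuron synaptic efficacy
s = (rng.random(N) < 0.2).astype(float)          # initial activity

activity = []
for _ in range(T):
    # Input to each neuron: sum of depressed excitatory inputs from active cells.
    h = C @ (w * d * s)
    s = (h >= theta).astype(float)               # discrete-time threshold update
    # Synaptic depression after firing, slow recovery otherwise.
    d = d + u_rec * (1.0 - d) - u_dep * d * s
    activity.append(s.mean())

print("mean network activity, first 10 steps:", np.round(activity[:10], 3))
```

Tracking activity over time is a crude stand-in for the expected network activity the paper analyses; varying theta and the depression rates changes the qualitative behavior of the trace.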


Evolved Asymmetry and Dilution of Random Synaptic Weights in Hopfield Network Turn a Spin-glass Phase into Associative Memory

We apply evolutionary computations to Hopfield's neural network model of associative memory. Previously, we reported that a genetic algorithm can enlarge the Hebb-rule associative memory by pruning some of the overloaded Hebbian synaptic weights. In this paper, we show that the genetic algorithm also evolves random synaptic weights to store some number of patterns.
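For context, the Hebb-rule storage and asynchronous recall that the genetic algorithm starts from can be sketched as below; the evolutionary step that prunes or evolves the weights is not shown, and the sizes are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(2)

N, P = 100, 10                      # neurons and stored patterns (toy sizes)
patterns = rng.choice([-1, 1], size=(P, N))

# Hebb rule: W_ij = (1/N) * sum_mu xi_i^mu xi_j^mu, with no self-connections.
W = (patterns.T @ patterns) / N
np.fill_diagonal(W, 0.0)

def recall(state, sweeps=5):
    """Asynchronous recall; a genetic algorithm could further prune or perturb W."""
    s = state.copy()
    for _ in range(sweeps):
        for i in rng.permutation(N):
            s[i] = 1 if W[i] @ s >= 0 else -1
    return s

# Retrieve the first stored pattern from a noisy cue (about 10% flipped bits).
cue = patterns[0] * np.where(rng.random(N) < 0.1, -1, 1)
print("overlap with stored pattern:", (recall(cue) @ patterns[0]) / N)
```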


Stochastic Online Learning for Network Optimization under Random Unknown Weights

We consider network optimization problems under random weights with unknown distributions. We first consider the shortest path problem that aims to optimize the quality of communication between a source and a destination through adaptive path selection. Due to the randomness and uncertainties in the network dynamics, the state of each communication link varies over time according to a stochasti...
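A toy sketch of adaptive path selection under unknown random link delays is given below, using a simple epsilon-greedy rule rather than the paper's algorithm; the candidate paths and delay distributions are hypothetical:

```python
import random
from collections import defaultdict

# Hypothetical candidate paths between a fixed source and destination;
# the true mean delays are unknown to the learner.
paths = ["A-B-D", "A-C-D", "A-B-C-D"]
true_mean = {"A-B-D": 3.0, "A-C-D": 2.5, "A-B-C-D": 4.0}

counts = defaultdict(int)
totals = defaultdict(float)

def choose(eps=0.1):
    # Explore with probability eps, otherwise exploit the lowest empirical delay.
    if random.random() < eps or not counts:
        return random.choice(paths)
    return min(paths,
               key=lambda p: totals[p] / counts[p] if counts[p] else float("inf"))

for _ in range(1000):
    p = choose()
    delay = random.gauss(true_mean[p], 0.5)   # random link state observed after use
    counts[p] += 1
    totals[p] += delay

best = min(paths, key=lambda p: totals[p] / counts[p])
print("empirically best path:", best)
```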


Journal

Journal title: ESAIM: Proceedings and Surveys

Year: 2019

ISSN: 2267-3059

DOI: 10.1051/proc/201965445